Asymptotically Optimal Stochastic Encryption for Quantized Sequential Detection in the Presence of Eavesdroppers
Authors
Abstract
We consider sequential detection based on quantized data in the presence of an eavesdropper. Stochastic encryption is employed as a countermeasure that flips the quantization bits at each sensor according to certain probabilities; the flipping probabilities are known only to the legitimate fusion center (LFC), not to the eavesdropping fusion center (EFC). As a result, the LFC employs the optimal sequential probability ratio test (SPRT) for sequential detection, whereas the EFC employs a mismatched SPRT (MSPRT). We characterize the asymptotic performance of the MSPRT in terms of the expected sample size as a function of the vanishing error probabilities. We show that when the detection error probabilities are set to be the same at the LFC and the EFC, every symmetric stochastic encryption is ineffective, in the sense that it leads to the same expected sample size at both fusion centers. Next, in the asymptotic regime of small detection error probabilities, we show that every stochastic encryption degrades the performance of quantized sequential detection at the LFC by increasing the expected sample size, and that the expected sample size required at the EFC is no smaller than that required at the LFC. We then investigate the optimal stochastic encryption, in the sense of maximizing the difference between the expected sample sizes required at the EFC and the LFC. Although this optimization problem is nonconvex, we show that if the acceptable tolerance for the increase in the expected sample size at the LFC induced by the stochastic encryption is small enough, then the globally optimal stochastic encryption can be obtained analytically; moreover, the optimal scheme flips only one type of quantized bit (i.e., 1 or 0) and keeps the other type unchanged.
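The setup described above can be sketched in simulation: each sensor's quantized bit is flipped with type-dependent probabilities, the LFC runs an SPRT under the correct post-encryption bit distribution, and the EFC runs the same test under the raw, pre-encryption model, i.e., a mismatched SPRT. The following minimal Monte Carlo sketch uses illustrative probabilities (p0, p1, q0, q1) chosen for demonstration, not taken from the paper, and standard Wald thresholds; it only shows the mechanics, not the paper's asymptotic analysis.

```python
import math
import random

def sprt_sample_size(true_p, model_p0, model_p1, alpha=1e-3, beta=1e-3, rng=None):
    """Run one SPRT on i.i.d. Bernoulli bits drawn with P(bit = 1) = true_p,
    accumulating the log-likelihood ratio under the *model* probabilities
    (model_p0 under H0, model_p1 under H1). Returns the number of samples
    taken until a Wald threshold is crossed."""
    rng = rng or random.Random()
    A = math.log((1 - beta) / alpha)   # upper threshold: decide H1
    B = math.log(beta / (1 - alpha))   # lower threshold: decide H0
    llr, n = 0.0, 0
    while B < llr < A:
        bit = 1 if rng.random() < true_p else 0
        n += 1
        if bit:
            llr += math.log(model_p1 / model_p0)
        else:
            llr += math.log((1 - model_p1) / (1 - model_p0))
    return n

def encrypted_rate(p, q0, q1):
    """P(encrypted bit = 1) when P(raw bit = 1) = p and the stochastic
    encryption flips 0 -> 1 with prob. q0 and 1 -> 0 with prob. q1."""
    return p * (1 - q1) + (1 - p) * q0

# Illustrative values: raw quantized bit is 1 w.p. 0.2 under H0, 0.6 under H1.
p0, p1 = 0.2, 0.6
q0, q1 = 0.1, 0.0   # one-sided flipping, as in the paper's optimal scheme
tp0, tp1 = encrypted_rate(p0, q0, q1), encrypted_rate(p1, q0, q1)

rng = random.Random(0)
trials = 2000
# LFC knows the flipping probabilities, so its SPRT model matches the true rates.
lfc = sum(sprt_sample_size(tp1, tp0, tp1, rng=rng) for _ in range(trials)) / trials
# EFC uses the raw (unencrypted) model: a mismatched SPRT.
efc = sum(sprt_sample_size(tp1, p0, p1, rng=rng) for _ in range(trials)) / trials
print(f"LFC avg samples: {lfc:.1f}, EFC avg samples: {efc:.1f}")
```

Note that the mismatch also perturbs the EFC's actual error probabilities away from the nominal (alpha, beta); the paper's comparison of expected sample sizes is made with the detection error probabilities matched at both centers and taken to zero, which this finite-threshold sketch does not enforce.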
Similar Articles
Monitoring Structural Changes in Econometric Models
The problem of sequential detection and diagnosis of structural changes in stochastic multivariate systems on the basis of sequential observations has many applications, including detection of changes in parameters of regression equations, testing adequacy of econometric models, fault detection and isolation in complex dynamical systems. There is an extensive statistical and econometric literat...
Optimal sequential detection and isolation of changes in stochastic systems
The purpose of this paper is to give a new statistical approach to the change diagnosis (detection/isolation) problem. The change detection problem has received extensive research attention. On the contrary, change isolation is mainly an unsolved problem. We consider a stochastic dynamical system with abrupt changes and investigate the multihypothesis extension of Lorden's results. We introduce...
Quantized Stochastic Gradient Descent: Communication versus Convergence
Parallel implementations of stochastic gradient descent (SGD) have received significant research attention, thanks to excellent scalability properties of this algorithm, and to its efficiency in the context of training deep neural networks. A fundamental barrier for parallelizing large-scale SGD is the fact that the cost of communicating the gradient updates between nodes can be very la...
Constructive quadratic functional quantization and critical dimension
We propose a constructive proof for the sharp rate of optimal quadratic functional quantization and we tackle the asymptotics of the critical dimension for quadratic functional quantization of Gaussian stochastic processes as the quantization level goes to infinity, i.e. the smallest dimensional truncation of an optimal quantization of the process which is "fully" quantized. We first establish ...
QSGD: Randomized Quantization for Communication-Optimal Stochastic Gradient Descent
Parallel implementations of stochastic gradient descent (SGD) have received significant research attention, thanks to excellent scalability properties of this algorithm, and to its efficiency in the context of training deep neural networks. A fundamental barrier for parallelizing large-scale SGD is the fact that the cost of communicating the gradient updates between nodes can be ve...
Journal: CoRR
Volume: abs/1703.02141
Pages: -
Publication date: 2017